Note on an extension of "Davidon" methods to nondifferentiable functions

Author

  • Claude Lemaréchal
Abstract

1. This note summarizes a paper [4] to appear in full elsewhere. It presents an algorithm for the minimization of a general (not necessarily differentiable) convex function. Its central idea is the construction of descent directions as projections of the origin onto the convex hull of previously calculated subgradients, as long as satisfactory progress can be made. Using projection to obtain a direction of movement for the conjugate gradient method is not new [5,8], but previous methods have projected onto the affine manifold spanned by the gradients, rather than onto the convex hull. Alternatively, Demjanov [2] has employed a projection onto the convex hull of a local set of subgradients to obtain a direction of steepest descent. The present algorithm may be said to combine these ideas. It is also connected with proposals of Bertsekas and Mitter [1] (by Theorem 2 below), and has many points in common with a method due to Wolfe [7].
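The central idea above, taking the descent direction as the projection of the origin onto the convex hull of stored subgradients, amounts to a small quadratic subproblem: find the minimum-norm convex combination of the subgradients. The sketch below solves it with a Frank-Wolfe iteration over the simplex; this is an illustrative choice of solver, not the procedure of the paper, and the function name `min_norm_point` is an assumption for the example.

```python
import numpy as np

def min_norm_point(G, iters=200, tol=1e-12):
    """Illustrative Frank-Wolfe sketch: nearest point to the origin
    in the convex hull of the rows of G.

    G is a (k, n) array whose rows are previously calculated
    subgradients. Returns p = G.T @ lam of minimum norm; -p then
    serves as a tentative descent direction.
    """
    k = G.shape[0]
    lam = np.full(k, 1.0 / k)      # start at the barycenter of the simplex
    for _ in range(iters):
        p = G.T @ lam              # current point in conv{g_1, ..., g_k}
        grad = 2.0 * (G @ p)       # gradient of ||G.T lam||^2 w.r.t. lam
        i = int(np.argmin(grad))   # linear minimization picks one vertex
        d = -lam
        d[i] += 1.0                # Frank-Wolfe direction e_i - lam
        q = G.T @ d
        denom = q @ q
        if denom <= tol:           # vertex already active: no progress
            break
        gamma = np.clip(-(p @ q) / denom, 0.0, 1.0)  # exact line search
        if gamma <= 0.0:           # current point is already optimal
            break
        lam = lam + gamma * d
    return G.T @ lam
```

For two collinear subgradients (2, 0) and (1, 0), the minimum-norm point of their convex hull is (1, 0), so the sketch would return that point and -p = (-1, 0) as the descent direction; when the returned p is (near) zero, the origin lies in the hull and the current iterate is (near) optimal.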


Similar articles

Note on a method of conjugate subgradients for minimizing nondifferentiable functions

An algorithm is described for finding the minimum of any convex, not necessarily differentiable, function f of several variables. The algorithm yields a sequence of points tending to the solution of the problem, if any, requiring only the calculation of f and one subgradient of f at designated points. Its rate of convergence is estimated for convex and for differentiable convex functions. For...


A note on "An interval type-2 fuzzy extension of the TOPSIS method using alpha cuts"

The technique for order of preference by similarity to ideal solution (TOPSIS) is a method based on ideal solutions, in which the most desirable alternative should have the shortest distance from the positive ideal solution and the longest distance from the negative ideal solution. Depending on the type of evaluation or the method of ranking, different approaches have been proposed to calculate distances ...


On generalized gradients and optimization

There exists a calculus for general nondifferentiable functions that subsumes a large part of the familiar subdifferential calculus for convex nondifferentiable functions [1]. This development started with F.H. Clarke, who introduced a generalized gradient for functions that are locally Lipschitz, but (possibly) nondifferentiable. Generalized gradients turn out to be the subdifferentials, in th...


A Modified Fletcher-Reeves-Type Method for Nonsmooth Convex Minimization

Conjugate gradient methods are efficient for smooth optimization problems, but conjugate gradient based methods for solving a possibly nondifferentiable convex minimization problem are rare. In this paper, by making full use of the inherent properties of Moreau-Yosida regularization and the descent property of a modified conjugate gradient method, we propose a modified Fletcher-Reeves-type method f...


An efficient extension of the Chebyshev cardinal functions for differential equations with coordinate derivatives of non-integer order

In this study, an effective numerical method for solving fractional differential equations using Chebyshev cardinal functions is presented. The fractional derivative is described in the Caputo sense. An operational matrix of fractional-order integration is derived and is utilized to reduce the fractional differential equations to a system of algebraic equations. In addition, illustrative examples...




Journal:
  • Math. Program.

Volume 7, Issue -

Pages -

Publication year: 1974